52 research outputs found

    Channel Detection in Coded Communication

    We consider the problem of block-coded communication, where in each block, the channel law belongs to one of two disjoint sets. The decoder aims to decode only messages that have undergone a channel from one of the sets, and thus has to detect the set that contains the prevailing channel. We begin with the simplified case where each of the sets is a singleton. For any given code, we derive the optimum detection/decoding rule in the sense of the best trade-off among the probabilities of decoding error, false alarm, and misdetection, and also introduce sub-optimal detection/decoding rules that are simpler to implement. Then, various achievable bounds on the error exponents are derived, including the exact single-letter characterization of the random coding exponents for the optimal detector/decoder. We then extend the random coding analysis to general sets of channels, and show that there exists a universal detector/decoder that performs asymptotically as well as the optimal detector/decoder tuned to detect a channel from a specific pair of channels. The case of a pair of binary symmetric channels is discussed in detail.
    Comment: Submitted to IEEE Transactions on Information Theory
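
    For orientation, here is one standard way to set up the three figures of merit; this is a generic sketch, not necessarily the exact convention of the paper. Suppose the decoder must decode when the underlying channel $W$ belongs to the set $\mathcal{W}_1$ and reject when $W$ belongs to the disjoint set $\mathcal{W}_2$, let $\mathcal{R}$ denote the rejection event, and let $m$ and $\hat{m}$ denote the transmitted and decoded messages. Then

        $P_{\mathrm{MD}} = \sup_{W \in \mathcal{W}_1} \Pr_W\{\mathcal{R}\}, \qquad P_{\mathrm{FA}} = \sup_{W \in \mathcal{W}_2} \Pr_W\{\mathcal{R}^{c}\}, \qquad P_{\mathrm{E}} = \sup_{W \in \mathcal{W}_1} \Pr_W\{\hat{m} \neq m,\, \mathcal{R}^{c}\},$

    and the detection/decoding rule is designed to obtain the best trade-off among the exponential decay rates of these three probabilities.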

    On the Reliability Function of Distributed Hypothesis Testing Under Optimal Detection

    The distributed hypothesis testing problem with full side-information is studied. The trade-off (reliability function) between the two types of error exponents under limited rate is studied in the following way. First, the problem is reduced to the problem of determining the reliability function of channel codes designed for detection (in analogy to a similar result that connects the reliability function of distributed lossless compression and ordinary channel codes). Second, a single-letter random-coding bound based on a hierarchical ensemble, as well as a single-letter expurgated bound, are derived for the reliability of channel-detection codes. Both bounds are derived for a system that employs the optimal detection rule. We conjecture that the resulting random-coding bound is ensemble-tight, and consequently optimal within the class of quantization-and-binning schemes.
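
    As a reminder of the objects involved (standard definitions, not specific to this paper): if $P_{1,n}$ and $P_{2,n}$ denote the two types of error probabilities after $n$ observations, the corresponding exponents are

        $E_{j} = \liminf_{n \to \infty} -\tfrac{1}{n} \log P_{j,n}, \qquad j \in \{1, 2\},$

    and the reliability function at communication rate $R$ traces the largest exponent of one type that is achievable while the exponent of the other type is kept above a prescribed level.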

    On the VC-Dimension of Binary Codes

    We investigate the asymptotic rates of length-$n$ binary codes with VC-dimension at most $dn$ and minimum distance at least $\delta n$. Two upper bounds are obtained, one as a simple corollary of a result by Haussler and the other via a shortening approach combining the Sauer-Shelah lemma and the linear programming bound. Two lower bounds are given using Gilbert-Varshamov-type arguments over constant-weight and Markov-type sets.
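
    To make the two parameters concrete, here is a small illustrative Python sketch (not from the paper) that computes the VC-dimension and the minimum distance of a binary code by brute force, treating the code as a set of rows and calling a coordinate subset shattered when the restriction of the code to it produces every possible pattern.

        from itertools import combinations, product

        def is_shattered(code, coords):
            # A coordinate subset is shattered when the codewords, restricted to it,
            # realize all 2^|coords| binary patterns.
            patterns = {tuple(c[i] for i in coords) for c in code}
            return len(patterns) == 2 ** len(coords)

        def vc_dimension(code, n):
            # Largest size d such that some d-subset of coordinates is shattered.
            # Shattering is monotone, so stop at the first size with no shattered subset.
            d = 0
            for size in range(1, n + 1):
                if any(is_shattered(code, s) for s in combinations(range(n), size)):
                    d = size
                else:
                    break
            return d

        def min_distance(code):
            # Minimum Hamming distance over all distinct pairs of codewords.
            return min(sum(a != b for a, b in zip(x, y)) for x, y in combinations(code, 2))

        # Example: the length-3 even-weight code has VC-dimension 2 and minimum distance 2.
        even_weight = [c for c in product((0, 1), repeat=3) if sum(c) % 2 == 0]
        print(vc_dimension(even_weight, 3), min_distance(even_weight))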

    Expurgated Bounds for the Asymmetric Broadcast Channel

    This work contains two main contributions concerning the expurgation of hierarchical ensembles for the asymmetric broadcast channel. The first is an analysis of the optimal maximum likelihood (ML) decoders for the weak and the strong user. Two different methods of code expurgation are used, which provide two competing error exponents. The second is the derivation of expurgated exponents under the generalized stochastic likelihood decoder (GLD). We prove that the GLD exponents are at least as tight as the maximum between the random coding error exponents derived in an earlier work by Averbuch and Merhav (2017) and one of our ML-based expurgated exponents. In doing so, we prove the existence of hierarchical codebooks that achieve the best of the random coding exponent and the expurgated exponent simultaneously for both users.
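
    For context, one common form of the generalized stochastic likelihood decoder (a sketch of the general idea, not necessarily the exact variant analyzed here) selects the decoded message at random according to

        $\Pr\{\hat{m} = m \mid \boldsymbol{y}\} = \frac{\exp\{n\, g(\hat{P}_{\boldsymbol{x}_m \boldsymbol{y}})\}}{\sum_{m'} \exp\{n\, g(\hat{P}_{\boldsymbol{x}_{m'} \boldsymbol{y}})\}},$

    where $\hat{P}_{\boldsymbol{x}_m \boldsymbol{y}}$ is the joint empirical distribution of the candidate codeword and the channel output and $g$ is a chosen decoding metric; suitable choices (and limits) of $g$ recover ordinary deterministic ML decoding.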

    Guessing with a Bit of Help

    What is the value of a single bit to a guesser? We study this problem in a setup where Alice wishes to guess an i.i.d. random vector, and can procure one bit of information from Bob, who observes this vector through a memoryless channel. We are interested in the guessing efficiency, which we define as the best possible multiplicative reduction in Alice's guessing moments obtainable by observing Bob's bit. For the case of a uniform binary vector observed through a binary symmetric channel, we provide two lower bounds on the guessing efficiency by analyzing the performance of the Dictator and Majority functions, and two upper bounds via maximum-entropy and Fourier-analytic/hypercontractivity arguments. We then extend our maximum-entropy argument to give a lower bound on the guessing efficiency for a general channel with a binary uniform input, via the strong data-processing inequality constant of the reverse channel. We compute this bound for the binary erasure channel, and conjecture that Greedy Dictator functions achieve the guessing efficiency.
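
    In standard guessing notation (a sketch of the quantities involved; the paper fixes the exact normalization): a guessing function $G$ maps each realization $x^{n}$ to its position in Alice's guessing order, and the figure of merit is the $\rho$-th guessing moment $\mathbb{E}[G(X^{n})^{\rho}]$. When Bob sends the single bit $b = f(Y^{n})$, Alice may use a separate guessing order for each value of $b$, so the guessing efficiency compares

        $\min_{G} \mathbb{E}\big[G(X^{n})^{\rho}\big] \qquad \text{against} \qquad \min_{f,\, G'} \mathbb{E}\big[G'(X^{n} \mid f(Y^{n}))^{\rho}\big],$

    typically on an exponential scale in $n$.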

    Self-Predicting Boolean Functions

    A Boolean function $g$ is said to be an optimal predictor for another Boolean function $f$ if it minimizes the probability that $f(X^{n}) \neq g(Y^{n})$ among all functions, where $X^{n}$ is uniform over the Hamming cube and $Y^{n}$ is obtained from $X^{n}$ by independently flipping each coordinate with probability $\delta$. This paper is about self-predicting functions, which are those that coincide with their optimal predictor.
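
    For completeness, the minimizer in this criterion is the usual Bayes predictor (writing the functions as $\{0,1\}$-valued):

        $g^{*}(y^{n}) = \begin{cases} 1, & \Pr\{f(X^{n}) = 1 \mid Y^{n} = y^{n}\} \ge 1/2, \\ 0, & \text{otherwise}, \end{cases}$

    with ties broken arbitrarily, so a function is self-predicting exactly when it agrees with this rule applied to its own noisy input.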